This week marks the 50th anniversary of the first atom bomb test. Science Editor Roger Highfield explains how the quest to avoid nuclear weapons testing has led to powerful supercomputers and the 'fantasy' bomb

With nuclear weapons issues, you don't want surprises
AT LEAST 2,000 nuclear tests have been conducted since the Manhattan Project ushered
in the atomic age half a century ago by exploding the most devastating weapon hitherto known. As testing becomes financially and politically more expensive, scientists increasingly choose to simulate explosions to determine the effectiveness of nuclear bombs.
These "fantasy" nuclear tests take place inside powerful supercomputers; the Sandia National Laboratories in New Mexico are at the forefront of such research.
Dr Ed Barsis, director of computer sciences at Sandia, said: "The US commitment to ending underground testing, constraints on non-nuclear testing, and loss of production capability call for greatly increased reliance on computer simulations to verify the safety, reliability, and performance of the nuclear stockpile."
The history of the atomic bomb has been inextricably entwined with that of the computer since that first intense flash of the Trinity test lit up the sky over Alamogordo, New Mexico, on July 16, 1945.
Faced with the threat of Nazi Germany developing an atomic bomb, America determined to win the race by bringing together a 2,500-strong team at Los Alamos, the focus of the Manhattan
Project. Among them, John von Neumann realised the importance of virtual testing when faced with the task of designing the explosive lenses used for compressing the plutonium at the heart of Trinity, a realisation that would spur the development of the ENIAC, a milestone in electronic computing.
To make an atomic bomb explode, scientists exploit the way heavy atomic nuclei fall apart to release energy and subatomic particles called neutrons. In "fission" weapons, balls of uranium or plutonium are squashed, so the neutrons are more likely to strike atomic nuclei and split them. Once the compressed material reaches critical mass, each fission releases enough neutrons to trigger further fissions, sustaining a chain reaction that unleashes vast quantities of energy.
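The arithmetic behind criticality can be sketched with a toy multiplication model: if each fission triggers, on average, k further fissions, the neutron population either dies away (k below 1) or grows explosively (k above 1). The figures below are illustrative only, not real weapon parameters:

```python
# Toy model of neutron multiplication in a fission chain reaction.
# k = average number of follow-on fissions caused by each fission.
# k < 1: the chain dies out (subcritical); k > 1: it grows (supercritical).
# All values are illustrative, not real weapon parameters.

def neutron_population(k, generations, start=1.0):
    """Neutron count after each fission generation, starting from one neutron."""
    pops = [start]
    for _ in range(generations):
        pops.append(pops[-1] * k)
    return pops

subcritical = neutron_population(k=0.9, generations=50)
supercritical = neutron_population(k=2.0, generations=50)

print(f"subcritical after 50 generations:   {subcritical[-1]:.3e}")
print(f"supercritical after 50 generations: {supercritical[-1]:.3e}")
```

Fission generations take around a hundredth of a microsecond each, which is why a supercritical assembly releases its energy almost instantaneously.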
Von Neumann used a series of equations that describe the flow properties of a fluid to model the compression. But solving them by hand was extremely laborious. This inspired him to suggest a numerical approach, in which a computer would be used to explore the mathematics.

Half a century later, there is a complex symbiosis between theory, bomb tests and computer
experiments. To simulate a nuclear blast, even supercomputers need data. For a new type of warhead, there are no easy alternatives to real data - what happens in an explosion. However, to verify the effectiveness and reliability of an already tested bomb design, nuclear test data can be augmented in cunning ways.
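Von Neumann's insight, turning a continuous flow equation into repeated arithmetic on a grid of cells, is the ancestor of today's hydrodynamic codes. A minimal sketch, using a simple one-dimensional advection equation rather than the real compression equations, with purely illustrative values:

```python
# Sketch of the numerical approach von Neumann championed: replace a
# continuous flow equation with arithmetic on a grid of cells.
# Here: 1-D linear advection, du/dt + c*du/dx = 0, first-order upwind scheme.
# The real weapon codes solve far harder equations in three dimensions.

c = 1.0                  # wave speed
dx, dt = 0.01, 0.005     # cell size and time step (CFL number = c*dt/dx = 0.5)
n_cells, n_steps = 100, 100

# initial condition: a square pulse of fluid
u = [1.0 if 30 <= i < 50 else 0.0 for i in range(n_cells)]

for _ in range(n_steps):
    # u[i-1] wraps around at i = 0, giving a periodic boundary
    u = [u[i] - c * dt / dx * (u[i] - u[i - 1]) for i in range(n_cells)]

# the pulse has drifted right by c*n_steps*dt/dx = 50 cells, slightly smeared
print(f"total mass: {sum(u):.3f}")  # the periodic boundary conserves the sum
```

Each time step is just a sweep of multiplications and subtractions over the grid, exactly the kind of repetitive arithmetic that made an electronic computer indispensable.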
These include hydrodynamic experiments, which reveal the behaviour of a dummy warhead under such pressure and shock that non-fissile isotopes of plutonium and uranium behave like liquids. There are also hydronuclear experiments, in effect, aborted explosions perilously close to the real thing.
During experiments, scientists use sensors and X-ray machines to take snapshots, and even holograms, of the core of the weapon during the implosion. After feeding the data into computers, it is possible to investigate how well the high explosive burns, whether it compresses the nuclear explosive symmetrically, and how the nuclear explosive behaves. Such virtual weapons testing is under way at the Los Alamos and Lawrence Livermore National Laboratories in America.
To simulate tests of the non-nuclear parts of the weapons, their sister laboratory, Sandia, relies on CTH and PCTH, a family of codes for simulating complex phenomena in high-velocity shock physics, from modelling the collision between comet Shoemaker-Levy 9 and Jupiter last year to predicting how tank armour will withstand a shell.
The codes are put to work on its Intel Paragon XP/S computer, the world's most powerful production supercomputer. Dr Mike McGlaun, manager of computational physics, said: "We've seen a 15- to 20-fold speed-up over the other supercomputers available to us. Calculations that took over a month, we do over a weekend."
The weapons are designed so that accidental detonation of their high explosives does not lead to a nuclear bang. However, simulations have to check what could happen in a silo full of weapons, where fragments moving at thousands of miles per hour could set off all the explosives.
To create the debris cloud in a virtual explosion, size and detail of the simulation must be specified. At Sandia, the computational mesh had to be fine enough to resolve the fragments, each about a centimetre in diameter. At the same time, it had to cover a large enough area to include the weapon containers. The result was a simulation of 10 million cells that ran overnight through the supercomputer. The analyst modelled the movement of debris, and its interaction with an adjacent container. Three-dimensional calculations showed that the structure of the container would help mitigate damage to neighbouring weapons. "When you're talking about nuclear weapons issues, you don't want any surprises," Dr McGlaun said.
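The mesh-sizing trade-off described above comes down to simple arithmetic: the cell size is fixed by the smallest feature to resolve, the domain size by the containers, and their ratio sets the cell count and the memory bill. A rough sketch, with assumed dimensions rather than Sandia's actual figures:

```python
# Back-of-envelope mesh sizing for a silo-accident simulation.
# All dimensions and per-cell costs are illustrative assumptions.

cell_size_m = 0.01           # 1 cm cells, fine enough to resolve the fragments
domain_m = (2.0, 2.0, 2.5)   # assumed extent covering the weapon containers

cells = 1
for extent in domain_m:
    cells *= int(extent / cell_size_m)

bytes_per_cell = 200         # assumed state per cell (density, velocity, energy...)
memory_gb = cells * bytes_per_cell / 1e9

print(f"cells: {cells:,} (~{cells / 1e6:.0f} million)")
print(f"estimated memory: {memory_gb:.1f} GB")
```

Halving the cell size in three dimensions multiplies the cell count by eight, which is why the appetite for memory and computing power grows so quickly with resolution.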
Sandia scientists also want to use finer resolution to provide greater confidence in the scientific results. The finer zone size enables them to move from simulations of an idealised weapon - a perfect sphere of nuclear explosive - to include asymmetries, such as joints, tubes and wires.
For full-scale virtual testing, the scientists say they need computing power orders of magnitude beyond what even the Paragon can provide. To run problems with 64 million cells and with more complex physics, they are hungry for computers with vastly more memory, and up to 20 times more computational power.
New software is also needed. For example, CTH and PCTH are what their developers call
"bang and splat codes", designed for events occurring at supersonic velocities. One of the codes being tested on the Paragon, ALEGRA, combines bang and splat physics with "shake, rattle and roll" physics, which describe events occurring at car-crash velocities. ALEGRA will enable scientists to analyse, for example, how the shock wave from a projectile striking a nuclear weapon might damage it.